Concentration of Norms and Eigenvalues of Random Matrices
Authors
Abstract
In this paper we prove concentration results for the norms of rectangular random matrices acting as operators between lp spaces, and for the eigenvalues of self-adjoint random matrices. Except for the self-adjointness condition when we consider eigenvalues, the only assumptions on the distribution of the matrix entries are independence and boundedness. Our approach is based on a powerful isoperimetric inequality for product probability spaces due to Talagrand [20].

Throughout this paper X = Xm,n will stand for an m × n random matrix with real or complex entries xjk. (Specific technical conditions on the xjk's will be introduced as needed for each result below.) If 1 ≤ p, q ≤ ∞ and A is an m × n matrix, we denote by ‖A‖p→q the operator norm of A : lp → lq. We denote by p′ = p/(p − 1) the conjugate exponent of p. For a real random variable Y we denote by EY the expected value and by MY any median of Y. Our first main result is the following.
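As a rough numerical illustration of the kind of concentration statement described in the abstract, the sketch below samples many m × n matrices with independent, bounded entries and measures how tightly the operator norm ‖X‖2→2 clusters around its median. The matrix sizes, the number of trials, and the uniform entry distribution are illustrative assumptions, not parameters from the paper.

```python
# Minimal numerical sketch (not from the paper): empirically illustrates
# concentration of the spectral norm ||X||_{2->2} of an m x n random matrix
# with independent, bounded entries. Sizes, trial count, and the uniform
# [-1, 1] entry distribution are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
m, n, trials = 200, 300, 500

norms = np.empty(trials)
for t in range(trials):
    X = rng.uniform(-1.0, 1.0, size=(m, n))  # independent, bounded entries
    norms[t] = np.linalg.norm(X, ord=2)      # operator norm l2 -> l2 (largest singular value)

# Concentration: the spread of ||X|| around its median is small relative
# to the median itself.
median = np.median(norms)
print(f"median of ||X||        ~ {median:.2f}")
print(f"std of ||X||           ~ {norms.std():.3f}")
print(f"max deviation from median ~ {np.abs(norms - median).max():.3f}")
```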
Similar Articles
Spectral Distributions of Adjacency and Laplacian Matrices of Random Graphs
In this paper, we investigate the spectral properties of the adjacency and the Laplacian matrices of random graphs. We prove (i) laws of large numbers for the spectral norms and the largest eigenvalues of the adjacency and the Laplacian matrices; (ii) under some further independence conditions, the normalized largest eigenvalues of the Laplacian matrices are dense in a compact interval ...
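A minimal empirical sketch of the law-of-large-numbers behaviour in (i), assuming an Erdős–Rényi model G(n, p) with fixed p (the cited work's exact model and normalization may differ): the largest adjacency eigenvalue divided by np approaches 1 as n grows.

```python
# Sketch under an assumed Erdos-Renyi G(n, p) model with fixed p = 0.2.
# Shows lambda_1(A) / (n * p) -> 1 as n grows for the adjacency matrix A.
import numpy as np

rng = np.random.default_rng(1)
p = 0.2
for n in (100, 400, 1600):
    # Symmetric 0/1 adjacency matrix with independent edges above the diagonal.
    upper = np.triu(rng.random((n, n)) < p, k=1).astype(float)
    A = upper + upper.T
    lam1 = np.linalg.eigvalsh(A)[-1]   # largest eigenvalue
    print(f"n = {n:5d}:  lambda_1 / (n p) = {lam1 / (n * p):.4f}")
```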
A mathematically simple method based on definition for computing eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices
In this paper, a fundamentally new method, based on the definition, is introduced for the numerical computation of eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices. Some examples are provided to show the accuracy and reliability of the proposed method. It is shown that the proposed method produces sequences different from those of existing methods, but they are still convergent to th...
APPLICATION OF THE RANDOM MATRIX THEORY ON THE CROSS-CORRELATION OF STOCK PRICES
The analysis of cross-correlations is extensively applied for understanding interconnections in stock markets. A variety of methods is used to investigate stock cross-correlations, including Random Matrix Theory (RMT), Principal Component Analysis (PCA) and Hierarchical Structures. In this work, we analyze cross-correlations between price fluctuations of 20 company stocks...
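A minimal sketch of the RMT approach mentioned above, using simulated returns rather than the 20 stocks analyzed in the article: build the empirical cross-correlation matrix and compare its eigenvalues with the Marchenko–Pastur bulk edges (1 ± √(N/T))², which RMT-based filtering commonly uses to separate noise from signal.

```python
# Sketch with simulated data (not the article's 20 stocks): empirical
# cross-correlation matrix of T return observations for N assets, and its
# eigenvalues versus the Marchenko-Pastur bulk edges (1 +/- sqrt(N/T))^2.
import numpy as np

rng = np.random.default_rng(2)
N, T = 20, 1000                      # assets, observations (illustrative)
returns = rng.standard_normal((T, N))

# Standardize each asset's return series, then form C = R^T R / T.
R = (returns - returns.mean(axis=0)) / returns.std(axis=0)
C = R.T @ R / T

eigenvalues = np.linalg.eigvalsh(C)
q = N / T
lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
outliers = eigenvalues[(eigenvalues < lam_minus) | (eigenvalues > lam_plus)]
print(f"Marchenko-Pastur bulk: [{lam_minus:.3f}, {lam_plus:.3f}]")
print(f"eigenvalues outside the bulk: {len(outliers)}")
```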
Statistical Mechanics and Random Matrices
Contents: 1. Introduction; 2. Motivations; 3. The different scales; typical results. Lecture 1, Wigner matrices and moments estimates: 1. Wigner's theorem; 2. Words in several independent Wigner matrices; 3. Estimates on the largest eigenvalue of Wigner matrices. Lecture 2, Gaussian Wigner matrices and Fredholm determinants: 1. Joint law of the ei...